A Gradient-Based Particle-Bat Algorithm for Stochastic Configuration Network
Authors
Abstract
The stochastic configuration network (SCN) is a mathematical model of incremental generation under a supervision mechanism, which has the universal approximation property and advantages in data modeling. However, the efficiency of SCN is affected by some of its parameters. An optimized searching algorithm for the input weights and biases is proposed in this paper. An optimization model with constraints is first established based on the convergence theory and the inequality supervision mechanism of SCN. Then, a hybrid bat-particle swarm optimization algorithm with gradient information (G-BAPSO) is proposed under the framework of the PSO algorithm, which mainly uses local adaptive adjustment characterized by the pulse emission frequency to improve the searching ability. The algorithm improves the convergence rate of the network. Simulation results over several datasets demonstrate the feasibility and validity of the proposed algorithm. The training RMSE of G-BAPSO-SCN increased by 5.57×10⁻⁵ and 3.2×10⁻³ compared with that of SCN in the two regression experiments, and the recognition accuracy increased by 0.07% on average in the classification experiments.
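The abstract only names the ingredients of G-BAPSO (bat-style pulse-frequency search, PSO personal/global-best attraction, and a gradient term). Purely as an illustrative sketch on a toy objective, with all function names, coefficients, and decay constants being my own assumptions rather than the paper's tuned values, such a hybrid update might look like:

```python
import numpy as np

def g_bapso(obj, grad, dim=2, n_bats=20, iters=100, seed=0):
    """Illustrative hybrid bat/PSO search with a gradient nudge (not the
    paper's exact algorithm; constants are placeholder assumptions)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_bats, dim))    # candidate weights/biases
    v = np.zeros((n_bats, dim))                  # velocities
    fit = np.array([obj(p) for p in x])
    pbest, pfit = x.copy(), fit.copy()           # PSO personal bests
    gbest = x[np.argmin(fit)].copy()             # global best
    loudness, pulse_rate = 0.9, 0.5              # bat-algorithm controls
    for _ in range(iters):
        freq = rng.uniform(0.0, 1.0, (n_bats, 1))          # pulse frequency
        v = (v + freq * (gbest - x)                        # bat attraction
               + 1.5 * rng.random((n_bats, dim)) * (pbest - x))  # PSO term
        x_new = x + v - 0.01 * np.array([grad(p) for p in x])    # gradient nudge
        # local random walk around the global best (bat local search)
        walk = rng.random(n_bats) > pulse_rate
        x_new[walk] = gbest + 0.01 * loudness * rng.standard_normal(
            (int(walk.sum()), dim))
        f_new = np.array([obj(p) for p in x_new])
        # accept improvements with probability tied to the loudness
        accept = (f_new < fit) & (rng.random(n_bats) < loudness)
        x[accept], fit[accept] = x_new[accept], f_new[accept]
        better = fit < pfit
        pbest[better], pfit[better] = x[better], fit[better]
        gbest = pbest[np.argmin(pfit)].copy()
        loudness *= 0.99                         # loudness decay
    return gbest, float(pfit.min())

# toy objective: sphere function with gradient 2x
best, best_f = g_bapso(lambda z: float(np.sum(z**2)), lambda z: 2.0 * z)
```

In the paper's setting the objective would instead measure how well a candidate node's input weights and biases reduce the current SCN residual while satisfying the inequality constraint.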
Similar Resources
A conjugate gradient based method for Decision Neural Network training
The Decision Neural Network is a new approach for solving multi-objective decision-making problems based on artificial neural networks. By using imprecise evaluation data, network training is improved and the number of required training data sets is decreased. The available training method is based on the gradient descent method (BP). One of its limitations is related to its convergence speed. Therefore,...
A Genetic Algorithm for Choice-Based Network Revenue Management
In recent years, enriching traditional revenue management models by considering customer choice behavior has been a main challenge for researchers. The terminology of the airline application is used as representative of the problem. A popular and efficient model considering these behaviors is choice-based deterministic linear programming (CDLP). This model assumes that each customer bel...
Stochastic Particle Gradient Descent for Infinite Ensembles
The superior performance of ensemble methods with infinitely many models is well known. Most of these methods are based on optimization problems in infinite-dimensional spaces with some regularization; for instance, boosting methods and convex neural networks use L1-regularization with a non-negativity constraint. However, due to the difficulty of handling L1-regularization, these problems require ea...
Optimal Reconfiguration of Distribution Network for Power Loss Reduction and Reliability Improvement Using Bat Algorithm
In power systems, reconfiguration is one of the simplest and lowest-cost methods to reach many goals such as self-healing, reliability improvement, and power loss reduction, without adding any additional components. With the expansion of distribution networks, communications become more complicated and the number of parameters increases, which makes the reconfiguration problem infeasib...
Stochastic Gradient Descent Algorithm in the Computational Network Toolkit
We introduce the stochastic gradient descent algorithm used in the computational network toolkit (CNTK) — a general purpose machine learning toolkit written in C++ for training and using models that can be expressed as a computational network. We describe the algorithm used to compute the gradients automatically for a given network. We also propose a low-cost automatic learning rate selection a...
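The snippet above describes stochastic gradient descent only in general terms. As a minimal generic illustration (not CNTK's actual implementation; the linear model, learning rate, and batch size are my assumptions), a minibatch SGD loop looks like:

```python
import numpy as np

# Fit y = w*x + b on synthetic data with minibatch SGD on the MSE loss.
rng = np.random.default_rng(42)
X = rng.uniform(-1.0, 1.0, 200)
y = 3.0 * X + 0.5 + 0.01 * rng.standard_normal(200)

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(50):
    idx = rng.permutation(200)                  # reshuffle each epoch
    for start in range(0, 200, 20):             # minibatches of 20
        batch = idx[start:start + 20]
        err = w * X[batch] + b - y[batch]       # prediction error
        w -= lr * 2.0 * np.mean(err * X[batch]) # dMSE/dw on the batch
        b -= lr * 2.0 * np.mean(err)            # dMSE/db on the batch
```

After training, `w` and `b` approximate the generating parameters 3.0 and 0.5.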
Journal
Journal title: Applied Sciences
Year: 2023
ISSN: 2076-3417
DOI: https://doi.org/10.3390/app13052878